Robust Forward Algorithms via PAC-Bayes and Laplace Distributions
Authors
Abstract
Laplace random variables are commonly used to model extreme noise in many fields, and systems trained to handle such noise are often characterized by robustness properties. We introduce new learning algorithms that minimize objectives derived directly from PAC-Bayes bounds incorporating Laplace distributions. The resulting algorithms are regularized by the Huber loss function and are robust to noise, as the Laplace distribution integrates over large deviations of the parameters. We analyze the convexity properties of the objective and propose several bounds that are fully convex, two of which are jointly convex in the mean and standard deviation under certain conditions. We derive new forward algorithms analogous to recent boosting algorithms, providing novel relations between boosting and PAC-Bayes analysis. Experiments show that our algorithms outperform AdaBoost, L1-LogBoost [10], and RobustBoost [11] over a wide range of input noise.
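The robustness property the abstract attributes to the Huber loss can be seen directly from its definition: it is quadratic near zero and linear in the tails, so a single extreme (Laplace-like) residual contributes only linearly to the objective. The sketch below is a standard illustration of the Huber loss, not the paper's own derivation; the threshold `delta` is a generic parameter chosen here for illustration.

```python
import numpy as np

def huber(r, delta=1.0):
    """Huber loss: quadratic for |r| <= delta, linear beyond.

    A large residual r adds O(|r|) to the objective instead of
    O(r^2), which is what makes the loss robust to outliers.
    """
    r = np.asarray(r, dtype=float)
    quad = 0.5 * r**2
    lin = delta * (np.abs(r) - 0.5 * delta)
    return np.where(np.abs(r) <= delta, quad, lin)

residuals = np.array([0.1, 0.5, 3.0, 10.0])
print(huber(residuals))       # tails grow linearly
print(0.5 * residuals**2)     # squared loss explodes on outliers
```

On the residual 10.0, the squared loss is 50.0 while the Huber loss (with `delta=1.0`) is only 9.5, which is the behavior exploited when training under heavy-tailed input noise.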
Similar Articles
Some Discriminant-Based PAC Algorithms
A classical approach in multi-class pattern classification is the following. Estimate probability distributions that generated the observations for each label class, and then label new instances by applying the Bayes classifier to the estimated distributions. That approach provides more useful information than just a class label; it also provides estimates of the conditional distribution of cla...
Bayesian Logistic Regression Model Choice via Laplace-Metropolis Algorithm
Following a Bayesian statistical inference paradigm, we provide an alternative methodology for analyzing a multivariate logistic regression. We use a multivariate normal prior in the Bayesian analysis. We present a unique Bayes estimator associated with a prior which is admissible. The Bayes estimators of the coefficients of the model are obtained via MCMC methods. The proposed procedure...
PAC Classification based on PAC Estimates of Label Class Distributions
A standard approach in pattern classification is to estimate the distributions of the label classes, and then to apply the Bayes classifier to the estimates of the distributions in order to classify unlabeled examples. As one might expect, the better our estimates of the label class distributions, the better the resulting classifier will be. In this paper we make this observation precise by ide...
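The plug-in recipe this entry describes — estimate each label class's distribution, then apply the Bayes classifier to the estimates — can be sketched minimally. The example below assumes one-dimensional Gaussian class-conditionals with equal priors purely for illustration; it is not the construction from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two label classes drawn from different (unknown) Gaussians.
X0 = rng.normal(-1.0, 1.0, size=500)
X1 = rng.normal(+1.5, 1.0, size=500)

# Step 1: estimate each label class's distribution from its sample.
mu0, s0 = X0.mean(), X0.std()
mu1, s1 = X1.mean(), X1.std()

def log_gauss(x, mu, s):
    """Log-density of N(mu, s^2) at x."""
    return -0.5 * np.log(2 * np.pi * s**2) - (x - mu) ** 2 / (2 * s**2)

# Step 2: plug-in Bayes classifier (equal priors): predict the class
# whose estimated density is larger at x.
def classify(x):
    return int(log_gauss(x, mu1, s1) > log_gauss(x, mu0, s0))

print(classify(-2.0), classify(2.0))  # → 0 1
```

The better the two density estimates, the closer this plug-in rule gets to the true Bayes classifier, which is exactly the observation the entry says the paper makes precise.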
EMPIRICAL BAYES ANALYSIS OF TWO-FACTOR EXPERIMENTS UNDER INVERSE GAUSSIAN MODEL
A two-factor experiment with interaction between factors wherein observations follow an Inverse Gaussian model is considered. Analysis of the experiment is approached via an empirical Bayes procedure. The conjugate family of prior distributions is considered. Bayes and empirical Bayes estimators are derived. Application of the procedure is illustrated on a data set, which has previously been an...
Bayes, E-Bayes and Robust Bayes Premium Estimation and Prediction under the Squared Log Error Loss Function
In risk analysis based on Bayesian framework, premium calculation requires specification of a prior distribution for the risk parameter in the heterogeneous portfolio. When the prior knowledge is vague, the E-Bayesian and robust Bayesian analysis can be used to handle the uncertainty in specifying the prior distribution by considering a class of priors instead of a single prior. In th...